How Inhibitory Oscillations Can Train Neural Networks and Punish Competitors

Authors

  • Kenneth A. Norman
  • Ehren L. Newman
  • Greg Detre
  • Sean M. Polyn
Abstract

We present a new learning algorithm that leverages oscillations in the strength of neural inhibition to train neural networks. Raising inhibition can be used to identify weak parts of target memories, which are then strengthened. Conversely, lowering inhibition can be used to identify competitors, which are then weakened. To update weights, we apply the Contrastive Hebbian Learning equation to successive time steps of the network. The sign of the weight change equation varies as a function of the phase of the inhibitory oscillation. We show that the learning algorithm can memorize large numbers of correlated input patterns without collapsing and that it shows good generalization to test patterns that do not exactly match studied patterns.
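The update described above can be sketched in a few lines. In this hypothetical illustration, the Contrastive Hebbian Learning (CHL) equation is applied to two successive network states, with the later state playing the "plus" role and the earlier state the "minus" role, and the whole update is scaled by a sign that depends on the phase of the inhibitory oscillation. The function names, the random stand-in activities, and the particular sign convention (positive while inhibition is returning toward its baseline, negative otherwise) are assumptions for illustration, not the authors' exact rule.

```python
import numpy as np

def chl_step(w, act_later, act_earlier, sign, lr=0.01):
    # CHL applied to successive time steps: later state acts as the
    # "plus" phase, earlier state as the "minus" phase; the whole
    # update is scaled by a phase-dependent sign.
    dw = np.outer(act_later, act_later) - np.outer(act_earlier, act_earlier)
    np.fill_diagonal(dw, 0.0)  # no self-connections
    return w + sign * lr * dw

rng = np.random.default_rng(0)
n = 8                                    # number of units
w = np.zeros((n, n))                     # symmetric weight matrix
t = np.linspace(0.0, 2.0 * np.pi, 20)
inhibition = 1.0 + 0.5 * np.sin(t)       # inhibition oscillates around 1.0
acts = rng.random((len(t), n))           # stand-in unit activities per step

for i in range(1, len(t)):
    # Hypothetical sign convention: strengthen while inhibition returns
    # toward baseline, weaken while it moves away from baseline.
    toward_baseline = abs(inhibition[i] - 1.0) < abs(inhibition[i - 1] - 1.0)
    sign = 1.0 if toward_baseline else -1.0
    w = chl_step(w, acts[i], acts[i - 1], sign)
```

Because both outer products are symmetric, the learned weight matrix stays symmetric; the diagonal is zeroed at every step so units never learn self-connections.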


Similar articles

A neural mass model of CA1-CA3 neural network and studying sharp wave ripples

We spend one third of our life in sleep. The interesting point about sleep is that neurons are not quiescent during it; they show synchronous oscillations in different regions. In particular, sharp wave ripples are observed in the hippocampus. Here, we propose a simple phenomenological neural mass model for the CA1-CA3 network of the hippocampus considering the spike frequency adap...


Neural and cognitive modeling with networks of leaky integrator units

After reviewing several physiological findings on oscillations in the electroencephalogram (EEG) and their possible explanations by dynamical modeling, we present neural networks consisting of leaky integrator units as a universal paradigm for neural and cognitive modeling. In contrast to standard recurrent neural networks, leaky integrator units are described by ordinary differential equation...


Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks

Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and by spike statistics that resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power sp...


Some Results on the Oscillation of Neural Networks

Neural oscillators consisting of formal neurons are designed to model the intrinsic oscillations of biological neural networks. The smallest oscillating network without self-connections is a three neuron ring network formed by two excitatory neurons and one inhibitory neuron. More sophisticated oscillators can be formed by coupling ring networks. Two coupled rings exhibit phase-locking. They ma...


Oscillatory activity in the neural networks of spiking elements.

We study the dynamics of activity in the neural networks of enhanced integrate-and-fire elements (with random noise, refractory periods, signal propagation delay, decay of postsynaptic potential, etc.). We consider the networks composed of two interactive populations of excitatory and inhibitory neurons with all-to-all or random sparse connections. It is shown by computer simulations that the r...



Journal:
  • Neural computation

Volume 18, Issue 7

Pages: –

Publication date: 2006